58 research outputs found

    On the Security of Lattice-Based Cryptography Against Lattice Reduction and Hybrid Attacks

    Over the past decade, lattice-based cryptography has emerged as one of the most promising candidates for post-quantum public-key cryptography. For most current lattice-based schemes, one can recover the secret key by solving a corresponding instance of the unique Shortest Vector Problem (uSVP), the problem of finding a shortest non-zero vector in a lattice whose shortest vector is unusually short. This work is concerned with the concrete hardness of the uSVP. In particular, we study the uSVP in general as well as instances of the problem with particularly small or sparse short vectors, which are used in cryptographic constructions to increase their efficiency. We study solving the uSVP in general via lattice reduction, more precisely, the Block-wise Korkine-Zolotarev (BKZ) algorithm. In order to solve an instance of the uSVP via BKZ, the applied block size, which parametrizes the BKZ algorithm, needs to be sufficiently large. However, a larger block size results in higher runtimes of the algorithm. It is therefore of utmost interest to determine the minimal block size that guarantees the success of solving the uSVP via BKZ. In this thesis, we provide a theoretical and experimental validation of a success condition for BKZ when solving the uSVP which can be used to determine the minimal required block size. We further study the practical implications of using so-called sparsification techniques in combination with the above approach. With respect to uSVP instances with particularly small or sparse short vectors, we investigate so-called hybrid attacks. We first adapt the “hybrid lattice reduction and meet-in-the-middle attack” (or short: the hybrid attack) by Howgrave-Graham on the NTRU encryption scheme to the uSVP. Due to this adaptation, the attack can be applied to a larger class of lattice-based cryptosystems. In addition, we enhance the runtime analysis of the attack, e.g., by an explicit calculation of the involved success probabilities.
As a next step, we improve the hybrid attack in two directions as described in the following. To reflect the potential of a modern attacker on classical computers, we show how to parallelize the attack. We show that our parallel version of the hybrid attack scales well within realistic parameter ranges. Our theoretical analysis is supported by practical experiments, using our implementation of the parallel hybrid attack which employs Open Multi-Processing and the Message Passing Interface. To reflect the power of a potential future attacker who has access to a large-scale quantum computer, we develop a quantum version of the hybrid attack which replaces the classical meet-in-the-middle search by a quantum search. Not only is the quantum hybrid attack faster than its classical counterpart, but it is also applicable to a wider range of uSVP instances (and hence to a larger number of lattice-based schemes), as it uses a quantum search which is sensitive to the distribution on the search space. Finally, we demonstrate the practical relevance of our results by using the techniques developed in this thesis to evaluate the concrete security levels of the lattice-based schemes submitted to the US National Institute of Standards and Technology’s process of standardizing post-quantum public-key cryptography.
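The minimal-block-size question discussed above can be made concrete with a short sketch. The root-Hermite-factor formula and the success condition below are standard heuristics of the kind the thesis validates (roughly, the projected secret must fall below the predicted Gram-Schmidt norm at position d − β), not the thesis's exact condition, and the parameter values in the example are purely illustrative:

```python
import math

def delta_0(beta):
    """Heuristic root-Hermite factor achieved by BKZ with block size beta
    (standard asymptotic estimate; inaccurate for very small beta)."""
    return ((beta / (2 * math.pi * math.e)) * (math.pi * beta) ** (1 / beta)) ** (
        1 / (2 * (beta - 1))
    )

def min_block_size(norm_v, dim, log2_det):
    """Smallest beta for which a success condition of the form
    sqrt(beta/dim) * ||v|| <= delta^(2*beta - dim - 1) * det(L)^(1/dim)
    holds, i.e. the projection of the short vector v is expected to be
    found by the SVP oracle in the last full block."""
    for beta in range(50, dim):
        lhs = math.sqrt(beta / dim) * norm_v
        rhs = delta_0(beta) ** (2 * beta - dim - 1) * 2 ** (log2_det / dim)
        if lhs <= rhs:
            return beta
    return None

# Illustrative toy instance: dimension 1024, ||v|| ~ sqrt(1024) * 3.2,
# determinant 2^6656 (e.g. a q-ary lattice with q = 2^13, n = 512).
beta = min_block_size(102.4, 1024, 6656)
```

Scanning block sizes upward like this reflects the trade-off stated above: the condition eventually holds because delta_0 shrinks with beta, but every increase in beta raises the cost of the SVP oracle inside BKZ.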

    Revisiting the Hybrid Attack: Improved Analysis and Refined Security Estimates

    Over the past decade, the hybrid lattice reduction and meet-in-the-middle attack (called the Hybrid Attack) has been used to evaluate the security of many lattice-based cryptographic schemes such as NTRU, NTRU Prime, BLISS, and more. Unfortunately, none of the previous analyses of the Hybrid Attack is entirely satisfactory: they are based on simplifying assumptions that may distort the security estimates. Such simplifying assumptions include setting probabilities equal to 1 which, for the parameter sets we analyze in this work, are in fact as small as 2^{-80}. Many of these assumptions lead to underestimating the scheme's security. However, some lead to security overestimates, and without further analysis, it is not clear which is the case. Therefore, the current security estimates against the Hybrid Attack are not reliable and the actual security levels of many lattice-based schemes are unclear. In this work we present an improved runtime analysis of the Hybrid Attack that gets rid of these incorrect simplifying assumptions. Our improved analysis can be used to derive reliable and accurate security estimates for many lattice-based schemes. In addition, we reevaluate the security against the Hybrid Attack of the NTRU, NTRU Prime, and R-BinLWEEnc encryption schemes as well as of the BLISS and GLP signature schemes. Our results show that there exist both security over- and underestimates in the literature. Our results further show that the common claim that the Hybrid Attack is the best attack on all NTRU parameter sets is in fact a misconception based on incorrect analyses of the attack.
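To see how a probability silently set to 1 can in fact be tiny, the sketch below computes one probability of the kind that appears in hybrid-attack analyses: the chance that the coordinates an attacker guesses to be zero in a sparse ternary secret really are all zero. The hypergeometric formula and the parameter values are illustrative assumptions, not the exact quantities analyzed in the paper:

```python
from math import comb

def prob_guessed_coords_zero(n, h, r):
    """Probability that r fixed coordinates of a length-n secret with
    exactly h non-zero entries (positions uniformly random) are all zero.
    Counted hypergeometrically: all h non-zeros must land in the other
    n - r positions."""
    return comb(n - r, h) / comb(n, h)

# Illustrative parameters: n = 512, h = 64 non-zeros, r = 128 guessed
# coordinates. The resulting probability is minuscule, far from 1.
p = prob_guessed_coords_zero(512, 64, 128)
```

Dividing an attack's runtime by such a probability (rather than treating it as 1) is exactly the kind of correction that can move a security estimate by many bits in either direction.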

    Revisiting the Expected Cost of Solving uSVP and Applications to LWE

    Abstract: Reducing the Learning with Errors problem (LWE) to the Unique-SVP problem and then applying lattice reduction is a commonly relied-upon strategy for estimating the cost of solving LWE-based constructions. In the literature, two different conditions are formulated under which this strategy is successful: one, widely used, going back to Gama & Nguyen's work on predicting lattice reduction (Eurocrypt 2008), and the other recently outlined by Alkim et al. (USENIX 2016). Since these two estimates predict significantly different costs for solving LWE parameter sets from the literature, we revisit the Unique-SVP strategy. We present empirical evidence from lattice-reduction experiments exhibiting a behaviour in line with the latter estimate. However, we also observe that in some situations lattice reduction behaves somewhat better than expected from Alkim et al.'s work and explain this behaviour under standard assumptions. Finally, we show that the security estimates of some LWE-based constructions from the literature need to be revised and give refined expected solving costs.
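The 2008-style estimate can be sketched as a gap condition: the unique short vector is found once the gap λ2/λ1 exceeds τ·δ^d for an experimentally fitted constant τ. The formula for δ, the form of the condition, and the value τ = 0.3 below are heuristic assumptions drawn from the literature, so this is an illustration of the estimate's shape, not either paper's exact estimator:

```python
import math

def delta_0(beta):
    """Heuristic root-Hermite factor for BKZ with block size beta."""
    return ((beta / (2 * math.pi * math.e)) * (math.pi * beta) ** (1 / beta)) ** (
        1 / (2 * (beta - 1))
    )

def gap_condition_beta(gap, dim, tau=0.3):
    """2008-style estimate: uSVP deemed solvable by BKZ-beta once
    gap >= tau * delta^dim. tau is an experimentally fitted constant
    (0.3 is a placeholder assumption). Returns the smallest such beta."""
    for beta in range(50, dim):
        if gap >= tau * delta_0(beta) ** dim:
            return beta
    return None
```

Comparing the block size this condition yields with the one from the 2016-style projection condition, for the same parameter set, is precisely the discrepancy the paper investigates experimentally.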

    Recovering Short Generators of Principal Fractional Ideals in Cyclotomic Fields of Conductor p^α q^β

    Several recent cryptographic constructions - including a public key encryption scheme, a fully homomorphic encryption scheme, and a candidate multilinear map construction - rely on the hardness of the short generator principal ideal problem (SG-PIP): given a ℤ-basis of some principal (fractional) ideal in an algebraic number field that is guaranteed to have an exceptionally short generator with respect to the logarithmic embedding, find a shortest generator of the principal ideal. The folklore approach to solving this problem is to split it into two subproblems. First, recover some arbitrary generator of the ideal, which is known as the principal ideal problem (PIP). Second, solve a bounded distance decoding (BDD) problem in the log-unit lattice to transform this arbitrary generator into a shortest generator of the ideal. The first problem, i.e., solving the PIP, is known to be solvable in polynomial time on quantum computers for arbitrary number fields under the generalized Riemann hypothesis due to Biasse and Song. Cramer, Ducas, Peikert, and Regev showed, based on the work of Campbell, Groves, and Shepherd, that the second problem can be solved in polynomial time on classical computers for cyclotomic number fields of prime-power conductor. In this work, we extend the work of Cramer, Ducas, Peikert, and Regev to cyclotomic number fields K = ℚ(ξ_m) of conductor m = p^α q^β, where p and q are distinct odd primes. In more detail, we show that the second problem can be solved in classical polynomial time (with quantum polynomial-time precomputation) under some sufficient conditions, if (p, q) is an (α, β)-generator prime pair, a new notion introduced in this work. We further provide experimental evidence suggesting that roughly 35% of all prime pairs are (α, β)-generator prime pairs for all α and β.
Combined with the work of Biasse and Song, our results show that under sufficient conditions the SG-PIP can be solved in quantum polynomial time in cyclotomic number fields of composite conductor of the form p^α q^β.
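The BDD step described above can be illustrated with simple Babai-style round-off decoding: express the target in the lattice basis, round the coefficients, and subtract the decoded lattice vector from the log-embedding of the arbitrary generator to shorten it. The tiny basis below is generic and merely stands in for a basis of the log-unit lattice (e.g., logarithms of cyclotomic units); it is a sketch, not the paper's decoding algorithm:

```python
import numpy as np

def babai_round_off(B, t):
    """Round-off decoding: B holds basis vectors as rows, t is the target.
    Compute t's real coefficients in the basis, round them to integers,
    and return the resulting lattice vector (close to t if t is near the
    lattice and B is sufficiently well-conditioned)."""
    c = np.rint(t @ np.linalg.inv(B))
    return c @ B

# Toy example: a target slightly perturbed off the lattice point 2*b1 + 3*b2.
B = np.array([[1.0, 0.0],
              [0.3, 1.0]])
t = np.array([3.0, 2.95])
v = babai_round_off(B, t)   # recovers the nearby lattice vector
```

In the SG-PIP setting, subtracting the decoded vector from the log-embedding of the recovered generator corresponds to dividing the generator by a unit, which leaves the ideal unchanged but shortens the generator.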

    Re-structuring of marine communities exposed to environmental change

    Species richness is the most commonly used but controversial biodiversity metric in studies on aspects of community stability such as structural composition or productivity. The apparent ambiguity of theoretical and experimental findings may in part be due to experimental shortcomings and/or heterogeneity of scales and methods in earlier studies. This has led to an urgent call for improved and more realistic experiments. In a series of experiments replicated at a global scale we translocated several hundred marine hard-bottom communities to new environments simulating a rapid but moderate environmental change. Subsequently, we measured their rate of compositional change (re-structuring) which in the great majority of cases represented a compositional convergence towards local communities. Re-structuring is driven by mortality of community components (original species) and establishment of new species in the changed environmental context. The rate of this re-structuring was then related to various system properties. We show that availability of free substratum relates negatively while taxon richness relates positively to structural persistence (i.e., no or slow re-structuring). Thus, when faced with environmental change, taxon-rich communities retain their original composition longer than taxon-poor communities. The effect of taxon richness, however, interacts with another aspect of diversity, functional richness. Indeed, taxon richness relates positively to persistence in functionally depauperate communities, but not in functionally diverse communities. The interaction between taxonomic and functional diversity with regard to the behaviour of communities exposed to environmental stress may help understand some of the seemingly contrasting findings of past research.

    The 2020 UV emitter roadmap

    Solid-state UV emitters have many advantages over conventional UV sources. The (Al,In,Ga)N material system is best suited to produce LEDs and laser diodes from 400 nm down to 210 nm due to its large and tuneable direct band gap, its n- and p-doping capability up to the largest-bandgap material AlN, and a growth and fabrication technology compatible with current visible InGaN-based LED production. However, AlGaN-based UV emitters still suffer from numerous challenges compared to their visible counterparts, which become most obvious when considering their light output power, operation voltage, and long-term stability. Most of these challenges are related to the large band gap of the materials. However, the development since the first realization of UV electroluminescence in the 1970s shows that improved understanding and technology allow the performance of UV emitters to be pushed far beyond the current state. One example is the very recent realization of edge-emitting laser diodes emitting in the UVC at 271.8 nm and in the UVB spectral range at 298 nm. This roadmap summarizes the current state of the art for the most important aspects of UV emitters and their challenges, and provides an outlook for future developments.
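The quoted wavelength range maps directly to band-gap energies via λ = hc/E; with E in electronvolts this is approximately 1239.84/E nanometres. A minimal conversion helper (illustrative only; it ignores exciton binding and other offsets between band gap and actual emission energy):

```python
def emission_wavelength_nm(bandgap_eV):
    """Photon wavelength for a given transition energy:
    lambda[nm] = h*c / E ~ 1239.84 / E[eV]."""
    return 1239.84 / bandgap_eV

# AlN's ~6.2 eV direct band gap corresponds to ~200 nm, the deep-UV end
# of the range quoted above; ~3.1 eV corresponds to ~400 nm.
aln_nm = emission_wavelength_nm(6.2)
```

This single relation is why pushing emission from 400 nm down to 210 nm requires tuning the (Al,In,Ga)N composition all the way up to nearly pure AlN.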
